Nearly Tight Bounds on $\ell_1$ Approximation of Self-Bounding Functions

Authors

  • Vitaly Feldman
  • Pravesh Kothari
  • Jan Vondrák
Abstract

We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube {0,1}^n. Informally, a function f : {0,1}^n → R is self-bounding if for every x ∈ {0,1}^n, f(x) upper bounds the sum of all the n marginal decreases in the value of the function at x. Self-bounding functions include such well-known classes of functions as submodular and fractionally-subadditive (XOS) functions. They were introduced by Boucheron et al. in the context of concentration-of-measure inequalities [BLM00]. Our main result is a nearly tight ℓ1-approximation of self-bounding functions by low-degree juntas. Specifically, every self-bounding function can be ε-approximated in ℓ1 by a polynomial of degree Õ(1/ε) over 2^{Õ(1/ε)} variables. Both the degree and the junta size are optimal up to logarithmic terms. Previously, the best known bounds were O(1/ε²) on the degree and 2^{O(1/ε²)} on the number of variables [FV13]. These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of submodular, XOS and self-bounding functions. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity n^{Θ̃(1/ε)}.
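For reference, the informal condition above can be made precise along the lines of Boucheron et al. [BLM00]; the exact normalization used in the paper may differ slightly. Writing $x^{\oplus i}$ for $x$ with its $i$-th coordinate flipped, a non-negative $f : \{0,1\}^n \to \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$,

$$0 \le f(x) - \min\{f(x), f(x^{\oplus i})\} \le 1 \ \text{ for each } i \in [n], \qquad \sum_{i=1}^{n} \bigl( f(x) - \min\{f(x), f(x^{\oplus i})\} \bigr) \le f(x),$$

so the sum of the $n$ marginal decreases at $x$ is at most $f(x)$ itself.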


Similar Articles


Error bounds in approximating n-time differentiable functions of self-adjoint operators in Hilbert spaces via a Taylor's type expansion

Utilizing the spectral representation of self-adjoint operators in Hilbert spaces, some error bounds are given for approximating $n$-time differentiable functions of self-adjoint operators via a Taylor-type expansion.


Quasi-lumpability, Lower Bounding Coupling Matrices, and Nearly Completely Decomposable Markov Chains

In this paper, it is shown that nearly completely decomposable (NCD) Markov chains are quasi-lumpable. The state space partition is the natural one, and the technique may be used to compute lower and upper bounds on the stationary probability of each NCD block. In doing so, a lower bounding nonnegative coupling matrix is employed. The nature of the stationary probability bounds is closely relat...


Barycentric Bounds in Stochastic Programming: Theory and Application

The design and analysis of efficient approximation schemes is of fundamental importance in stochastic programming research. Bounding approximations are particularly popular for providing strict error bounds that can be made small by using partitioning techniques. In this article we develop a powerful bounding method for linear multistage stochastic programs with a generalized nonconvex dependen...


Tight bounds on rates of variable-basis approximation

Tight bounds on the approximation rates of nonlinear approximation by variable-basis functions, which include feedforward neural networks, are investigated. The connections with recent results on neural network approximation are discussed.





Publication year: 2014